Sensorimotor Features of Self-Awareness in Multimodal LLMs Embedded in Robots

Emergence of Artificial Self-Awareness via Sensorimotor Integration and Memory

Authors: I. Dellibarda Varela et al.
Published on Arxiv: 2025-05-25
Link: http://arxiv.org/abs/2505.19237v1
Institutions: Center for Automation and Robotics, Spanish National Research Council (CSIC-UPM), Madrid, Spain • Department of Electronic Engineering, University of Azuay, Cuenca, Ecuador
Keywords: Self-Awareness, Robotic Embodiment, Artificial Intelligence, Multimodal Large Language Models, Sensorimotor Integration, Episodic Memory, Gemini 2.0, Mobile Robot, Structural Equation Modeling, Embodied Cognition

Self-awareness is fundamental to intelligent, autonomous behavior, yet its mechanisms in artificial systems remain largely unexplored compared with the extensive research on humans and animals. Recent advances in large language models, especially multimodal variants, have demonstrated human-like capacities for integrating sensory input, raising a natural question: can machine self-awareness arise through embodied sensorimotor experience?

To address this question, the authors designed a study in which multimodal large language models (MM-LLMs) are embedded in an autonomous mobile robot platform. The study’s approach and main contributions are as follows:

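The embodied setup described above, an MM-LLM coupled to a robot's sensors, actuators, and an episodic memory, can be sketched as a minimal perception-memory-action loop. This is an illustrative sketch, not the authors' implementation: all class and function names (`Episode`, `EpisodicMemory`, `query_model`, `control_step`) are hypothetical, and the model call is a stub standing in for a real multimodal API such as Gemini 2.0.

```python
from dataclasses import dataclass, field

@dataclass
class Episode:
    """One step of experience: what the robot saw and what it did."""
    observation: str
    action: str

@dataclass
class EpisodicMemory:
    """Append-only store of past episodes, queried for recent context."""
    episodes: list = field(default_factory=list)

    def store(self, episode: Episode) -> None:
        self.episodes.append(episode)

    def recall(self, k: int = 3) -> list:
        """Return the k most recent episodes, oldest first."""
        return self.episodes[-k:]

def query_model(observation: str, context: list) -> str:
    """Stub standing in for a multimodal LLM call; a real system would
    send camera frames and sensor readings rather than a text label."""
    return f"advance past {observation} (recalled {len(context)} episodes)"

def control_step(observation: str, memory: EpisodicMemory) -> str:
    """One sensorimotor cycle: perceive, recall, decide, remember."""
    context = memory.recall()
    action = query_model(observation, context)
    memory.store(Episode(observation, action))
    return action

memory = EpisodicMemory()
for obs in ["corridor", "obstacle", "open door"]:
    control_step(obs, memory)
```

The point of the sketch is the feedback structure: each decision is conditioned on recalled episodes, so the robot's behavior depends on its own accumulated history, which is the kind of sensorimotor-plus-memory integration the study investigates.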
Building on this methodology, the paper then presents its results, showing how the integration of sensorimotor signals and episodic memory supports self-awareness in embodied AI systems:

Finally, drawing together these findings, the authors reach several important conclusions about the foundations and future potential of embodied machine self-awareness: